138 research outputs found

    Using the probabilistic evaluation tool for the analytical solution of large Markov models

    Stochastic Petri net-based Markov modeling is a potentially very powerful and generic approach for evaluating the performance and dependability of many different systems, such as computer systems, communication networks, and manufacturing systems. As a consequence of their general applicability, SPN-based Markov models form the basic solution approach for several software packages that have been developed for the analytic solution of performance and dependability models. In these tools, stochastic Petri nets are used to conveniently specify complicated models, after which an automatic mapping can be carried out to an underlying Markov reward model. Subsequently, this Markov reward model is solved by specialized solution algorithms, appropriately selected for the measure of interest. One of the major aspects that hampers the use of SPN-based Markov models for the analytic solution of performance and dependability results is the size of the state space. Although models of up to a few hundred thousand states can typically be solved conveniently on modern-day workstations, even larger models are often required to represent all the desired detail of the system. Our tool PET (probabilistic evaluation tool) circumvents the problems of large state spaces when the desired performance and dependability measures are transient measures. It does so by an approach named probabilistic evaluation.
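    Transient measures of a Markov (reward) model are commonly computed via uniformization; the sketch below illustrates that generic technique (it is not PET's actual algorithm, and the two-state failure/repair rates are invented for illustration):

```python
import numpy as np

def transient_distribution(Q, p0, t, tol=1e-10):
    """State probabilities of a CTMC at time t via uniformization."""
    lam = max(-Q.diagonal())          # uniformization rate
    P = np.eye(len(Q)) + Q / lam      # embedded DTMC transition matrix
    term = np.array(p0, dtype=float)  # p0 * P^k, starting at k = 0
    poisson = np.exp(-lam * t)        # Poisson weight for k = 0
    result = poisson * term
    k, weight_sum = 0, poisson
    while 1.0 - weight_sum > tol:     # sum Poisson-weighted terms
        k += 1
        term = term @ P
        poisson *= lam * t / k
        result += poisson * term
        weight_sum += poisson
    return result

# Illustrative two-state repair model: failure rate 1.0, repair rate 10.0
Q = np.array([[-1.0,  1.0],
              [10.0, -10.0]])
p = transient_distribution(Q, [1.0, 0.0], t=5.0)
```

    For large state spaces, the same scheme is applied with sparse matrix-vector products, which is what makes transient analysis tractable where steady-state solvers run out of memory.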

    Quantitative Evaluation of Enterprise DRM Technology

    It is of critical business importance for organizations to keep confidential digital documents secure, as the potential cost and damage incurred from the loss of confidential digital documents have increased significantly in recent years. Digital Rights Management (DRM) was developed, as one of many digital information security solutions, to help organizations keep digital documents secure. In this study, the functions of eight popular DRM products currently available on the market are reviewed, and the impact of using these DRM products is evaluated quantitatively. A group of metrics is defined reflecting the potential costs and impact to the organization incurred by implementing DRM products. Stochastic models are used to quantitatively evaluate the costs and impact of implementing a particular DRM product. It is found that although DRM products protect digital assets by encryption and by providing central control over information within the organization, this comes at a cost, since these security mechanisms typically reduce the productivity of the staff. The reduction in productivity is in turn measured in the form of non-productive time (NPT), which is an inherent part of the stochastic modeling process.

    Blockchain-based Smart Contracts: A Systematic Mapping Study

    An appealing feature of blockchain technology is smart contracts. A smart contract is executable code that runs on top of the blockchain to facilitate, execute and enforce an agreement between untrusted parties without the involvement of a trusted third party. In this paper, we conduct a systematic mapping study to collect all research that is relevant to smart contracts from a technical perspective. The aim of doing so is to identify current research topics and open challenges for future studies in smart contract research. We extract 24 papers from different scientific databases. The results show that about two thirds of the papers focus on identifying and tackling smart contract issues. Four key issues are identified, namely, codifying, security, privacy and performance issues. The rest of the papers focus on smart contract applications or other smart contract related topics. Research gaps that need to be addressed in future studies are provided. (Comment: Keywords: Blockchain, Smart contracts, Systematic Mapping Study, Survey)

    Risks of Offline Verify PIN on Contactless Cards

    Contactless card payments are being introduced around the world, allowing customers to use a card to pay for small purchases by simply placing the card onto the Point of Sale terminal. Contactless transactions do not require verification of the cardholder's PIN. However, our research has found that the redundant verify PIN functionality is present on the most commonly issued contactless credit and debit cards currently in circulation in the UK. This paper presents a plausible attack scenario which exploits contactless verify PIN to give unlimited attempts to guess the cardholder's PIN without their knowledge. It also gives experimental data to demonstrate the practical viability of the attack, as well as references to support our argument that contactless verify PIN is redundant functionality which compromises the security of payment cards and the cardholder.
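    The severity of unlimited offline attempts follows from simple arithmetic. Assuming a uniformly random 4-digit PIN (an idealizing assumption; real PIN choices are skewed, which only helps the attacker), the probability of success grows linearly in the number of distinct guesses:

```python
# Probability of guessing a PIN within n distinct attempts, assuming
# the PIN is drawn uniformly from the full 4-digit space (assumption).
def p_guess(n, pin_space=10_000):
    # each distinct guess eliminates exactly one candidate PIN
    return n / pin_space

p_limited = p_guess(3)       # usual 3-attempt online limit: 0.0003
p_offline = p_guess(1000)    # 1000 unnoticed offline attempts: 0.1
```

    With the normal three-attempt lockout the attack is negligible; removing the limit makes even a modest number of silent guesses dangerous.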

    Performance of a connectionless protocol over ATM

    Recent studies show the existence of a demand for a connectionless broadband service. In order to cope with this demand, a connectionless protocol for the B-ISDN needs to be designed. Such a protocol should make use of ATM and the ATM Adaptation Layer. It needs to specify destination and bandwidth of connections to the ATM network without advance knowledge of the traffic that has to be transferred over these connections. A possible mechanism which can cope with this problem, the 'On-demand Connection with Delayed Release' (OCDR) mechanism, is described. Its efficient operation is based on the assumption that there exists a certain correlation between subsequently arriving CL packets. Two different arrival processes are used to evaluate the performance of the OCDR mechanism: a Poisson arrival process, and a Markov Modulated Poisson Process (MMPP), which models a bursty traffic source. Markov models of the OCDR mechanism have been constructed for both arrival processes. For the model with Poisson arrivals, a closed-form solution is presented. The model with MMPP arrivals is solved numerically. Compared to a 'Permanent Connection' mechanism, significant bandwidth reductions can be obtained, provided that the offered traffic has a bursty nature. Furthermore, the OCDR mechanism has the advantageous property that the obtained average node delay is not strongly related to the intensity and burstiness of the offered traffic.
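    A two-state MMPP of the kind used for the bursty source can be sketched as follows: a background Markov chain switches between a quiet and a bursty state, and arrivals are Poisson at the rate of the current state. All rates below are illustrative assumptions, not values from the paper:

```python
import random

def mmpp_arrivals(t_end, lam=(0.5, 5.0), switch=(0.1, 0.3), seed=1):
    """Arrival times of a two-state MMPP on [0, t_end).

    lam[i]    : Poisson arrival rate while the modulating chain is in state i
    switch[i] : rate of leaving state i (exponential holding times)
    """
    rng = random.Random(seed)
    t, state, arrivals = 0.0, 0, []
    while t < t_end:
        t_switch = t + rng.expovariate(switch[state])  # next state change
        t_next = t + rng.expovariate(lam[state])       # next arrival
        while t_next < min(t_switch, t_end):
            arrivals.append(t_next)
            t_next += rng.expovariate(lam[state])
        t = t_switch
        state = 1 - state                              # toggle quiet/bursty
    return arrivals

trace = mmpp_arrivals(1000.0)
```

    Feeding such a trace (instead of plain Poisson arrivals) into a simulation of the connection mechanism is one way to reproduce the burstiness effect the numerical MMPP model captures analytically.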

    A method for analyzing the performance aspects of the fault-tolerance mechanisms in FDDI

    The ability of error recovery mechanisms to make the Fiber Distributed Data Interface (FDDI) satisfy real-time performance constraints in the presence of errors is analyzed. A complicating factor in these analyses is the rarity of the error occurrences, which makes direct simulation unattractive. Therefore, a fast simulation technique, called injection simulation, which makes it possible to analyze the performance of FDDI, including its fault tolerance behavior, was developed. The implementation of injection simulation for polling models of FDDI is discussed, along with simulation results.

    Betrayal, Distrust, and Rationality: Smart Counter-Collusion Contracts for Verifiable Cloud Computing

    Cloud computing has become an irreversible trend. With it comes the pressing need for verifiability, to assure the client of the correctness of computation outsourced to the cloud. Existing verifiable computation techniques all have a high overhead; thus, if deployed in the clouds, they would render cloud computing more expensive than the on-premises counterpart. To achieve verifiability at a reasonable cost, we leverage game theory and propose a smart contract based solution. In a nutshell, a client lets two clouds compute the same task, and uses smart contracts to stimulate tension, betrayal and distrust between the clouds, so that rational clouds will not collude and cheat. In the absence of collusion, verification of correctness can be done easily by crosschecking the results from the two clouds. We provide a formal analysis of the games induced by the contracts, and prove that the contracts will be effective under certain reasonable assumptions. By resorting to game theory and smart contracts, we are able to avoid heavy cryptographic protocols. The client only needs to pay two clouds to compute in the clear, and a small transaction fee to use the smart contracts. We also conducted a feasibility study that involves implementing the contracts in Solidity and running them on the official Ethereum network. (Comment: Published in ACM CCS 2017; this is the full version with all appendices.)

    10292 Abstracts Collection and Summary -- Resilience Assessment and Evaluation

    From July 18 to July 23, 2010, the Dagstuhl Seminar 10292 "Resilience Assessment and Evaluation" was held in Schloss Dagstuhl -- Leibniz Center for Informatics. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar, as well as abstracts of seminar results and ideas, are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided, if available.